# Multilingual Q&A
## Galactic Qwen 14B Exp2

Apache-2.0 · Large Language Model · Transformers · multilingual · prithivMLmods · 558 downloads · 4 likes

Galactic-Qwen-14B-Exp2 is a large language model based on the Qwen 2.5 14B architecture, focused on enhanced reasoning: it excels at context understanding, logical reasoning, and multi-step problem solving.

## Serpens Opus 14B Exp

Apache-2.0 · Large Language Model · Transformers · multilingual · prithivMLmods · 158 downloads · 1 like

Serpens-Opus-14B-Exp is a 14-billion-parameter model based on the Qwen 2.5 14B architecture, designed to enhance reasoning capabilities for general-purpose reasoning and Q&A tasks.

## Lucie 7B Instruct V1.1

Apache-2.0 · Large Language Model · multilingual · OpenLLM-France · 13.33k downloads · 9 likes

A multilingual causal language model fine-tuned from Lucie-7B, supporting French and English and focused on instruction following and text generation.

## Qwen2.5 0.5b Test Ft

Apache-2.0 · Large Language Model · Transformers · multilingual · KingNish · 1,004 downloads · 11 likes

Qwen 2.5 0.5B is a compact yet powerful language model fine-tuned from Qwen/Qwen2.5-0.5B-Instruct, supporting multiple languages with performance close to the Llama 3.2 1B model.

## Llama 3.1 Openhermes Tr

Apache-2.0 · Large Language Model · Transformers · multilingual · umarigan · 5,520 downloads · 3 likes

A Turkish-English bilingual model fine-tuned from unsloth/llama-3-8b-bnb-4bit, with training speed optimized using Unsloth.

## Mmedlm

Apache-2.0 · Large Language Model · Transformers · multilingual · Henrychur · 30 downloads · 5 likes

MMedLM is a multilingual healthcare foundation model with 7 billion parameters, based on the InternLM architecture and pretrained on the comprehensive multilingual medical corpus MMedC.

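The causal language models listed above are generally used through the standard Hugging Face Transformers API. The sketch below is a minimal example, assuming the Lucie repo ID `OpenLLM-France/Lucie-7B-Instruct-v1.1` and a tokenizer-provided chat template; repository names and prompt conventions should be confirmed on each model card.

```python
# Minimal sketch: multilingual Q&A with an instruction-tuned causal LM.
# The repo ID below is assumed from the listing; verify it on the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OpenLLM-France/Lucie-7B-Instruct-v1.1"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "user", "content": "Quelle est la capitale de la France ?"}
]
# Most instruction-tuned models ship a chat template with their tokenizer.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same pattern applies to the Qwen- and Llama-based models in this list; only the repo ID and, where relevant, the chat template differ.
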
## My Awesome Qa Model

Apache-2.0 · Question Answering System · Transformers · vnktrmnb · 14 downloads · 0 likes

A question-answering model based on bert-base-multilingual-cased, fine-tuned on the SQuAD dataset.

## Multilingual Bert Finetuned Xquad

Apache-2.0 · Question Answering System · Transformers · ritwikm · 24 downloads · 0 likes

A multilingual Q&A model based on bert-base-multilingual-cased, fine-tuned on the XQuAD dataset.

## Mdeberta V3 Base Squad2

MIT · Question Answering System · Transformers · multilingual · timpal0l · 14.06k downloads · 246 likes

A multilingual Q&A model based on mDeBERTa-v3-base, fine-tuned on the SQuAD 2.0 dataset.

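Extractive Q&A fine-tunes such as this one can be queried through the Transformers `question-answering` pipeline. A minimal sketch follows, assuming the repo ID `timpal0l/mdeberta-v3-base-squad2` implied by the entry above; the same pattern applies to the other SQuAD/XQuAD fine-tunes in this list.

```python
# Minimal sketch: extractive Q&A with a SQuAD2-style encoder model.
# Repo ID assumed from the listing entry above; verify on the model card.
from transformers import pipeline

qa = pipeline("question-answering", model="timpal0l/mdeberta-v3-base-squad2")

result = qa(
    question="Where does Lisa live?",
    context="Lisa lives in Stockholm.",
)
# The pipeline returns a span extracted from the context,
# e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': 'Stockholm'}
print(result)
```

Because the underlying encoders are multilingual, the question and context can also be given in other supported languages.
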
## Xlm Roberta Large Xquad

Question Answering System · Transformers · Other · alon-albalak · 45 downloads · 2 likes

A multilingual Q&A model based on XLM-RoBERTa-large, trained on the XQuAD dataset and supporting extractive Q&A in 11 languages.

## Xlm Roberta Qa Chaii

Question Answering System · Transformers · multilingual · gokulkarthik · 14 downloads · 0 likes

A multilingual Q&A model based on pretrained XLM-RoBERTa, supporting English, Tamil, and Hindi and optimized for Indian-language Q&A scenarios.

## Bert Multi Cased Finetuned Xquadv1

Question Answering System · Other · mrm8488 · 1,100 downloads · 5 likes

Based on Google's multilingual BERT base model and fine-tuned on Q&A datasets in 11 languages, supporting cross-lingual Q&A tasks.

## Bert Base Multilingual Cased Finetuned Squad

Question Answering System · Other · salti · 28 downloads · 14 likes

A question-answering model based on multilingual BERT and fine-tuned on the Stanford Question Answering Dataset (SQuAD v1.1), supporting reading comprehension in multiple languages.

## Squad Mbart Model

Question Answering System · Transformers · ZYW · 18 downloads · 0 likes

An mBART model trained from scratch on an unspecified dataset; its intended uses and capabilities are not documented.

## Xlmroberta Squadv2

Question Answering System · Transformers · aware-ai · 15 downloads · 0 likes

An xlm-roberta-large model fine-tuned on the SQuAD 2.0 dataset for question answering.

## Xlm Roberta Large Finetuned Squad V2

MIT · Question Answering System · Transformers · sontn122 · 67 downloads · 0 likes

A Q&A model based on xlm-roberta-large, fine-tuned on the squad_v2 dataset.

## Xlm Roberta Large Qa Multilingual Finedtuned Ru

Apache-2.0 · Question Answering System · Transformers · multilingual · AlexKay · 1,814 downloads · 48 likes

A model based on the XLM-RoBERTa architecture, pretrained with a masked language modeling objective and fine-tuned on English and Russian question answering datasets.

## Mt5 Base Finetuned Tydiqa Xqa

Question Answering System · Transformers · Other · Narrativa · 368 downloads · 6 likes

A multilingual Q&A model based on Google's mT5-base, fine-tuned on the TyDi QA dataset and supporting Q&A in 101 languages.

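Unlike the extractive models above, mT5-based Q&A fine-tunes generate the answer as text. A minimal sketch follows, assuming the repo ID `Narrativa/mT5-base-finetuned-tydiQA-xqa` and a `question: ... context: ...` input format; both the repository name and the prompt convention are assumptions to verify on the model card.

```python
# Minimal sketch: generative Q&A with a fine-tuned mT5 model.
# Repo ID and input format are assumptions; check the model card for the
# exact prompt convention used during fine-tuning.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "Narrativa/mT5-base-finetuned-tydiQA-xqa"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = (
    "question: ¿Dónde vive Manuel? "
    "context: Manuel vive en Murcia, España."
)
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```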